44 research outputs found

    Risk screening of geriatric patients at hospital admission using a clinical rule based on the HARM study

    Get PDF
    Risk screening of the geriatric patient at admission using a clinical rule based on the results of the HARM study

    Pathways for avian influenza virus spread: GPS reveals wild waterfowl in commercial livestock facilities and connectivity with the natural wetland landscape

    Get PDF
    Zoonotic diseases are of considerable concern to the human population and viruses such as avian influenza (AIV) threaten food security, wildlife conservation and human health. Wild waterfowl and the natural wetlands they use are known AIV reservoirs, with birds capable of virus transmission to domestic poultry populations. While infection risk models have linked migration routes and AIV outbreaks, there is a limited understanding of wild waterfowl presence on commercial livestock facilities, and movement patterns linked to natural wetlands. We documented 11 wild waterfowl (three Anatidae species) in or near eight commercial livestock facilities in Washington and California with GPS telemetry data. Wild ducks used dairy and beef cattle feed lots and facility retention ponds during both day and night, suggesting use for roosting and foraging. Two individuals (single locations) were observed inside poultry facility boundaries while using nearby wetlands. Ducks demonstrated high site fidelity, returning to the same areas of habitats (at livestock facilities and nearby wetlands) across months or years, showed strong connectivity with surrounding wetlands, and arrived from wetlands up to 1251 km away in the week prior. Telemetry data provides substantial advantages over observational data, allowing assessment of individual movement behaviour and wetland connectivity that has significant implications for outbreak management. Telemetry improves our understanding of risk factors for waterfowl–livestock virus transmission and helps identify factors associated with coincident space use at the wild waterfowl–domestic livestock interface. Our research suggests that even relatively small or isolated natural and artificial water or food sources in/near facilities increase the likelihood of attracting waterfowl, which has important consequences for managers attempting to minimize or prevent AIV outbreaks. Use and interpretation of telemetry data, especially in near-real-time, could provide key information for reducing virus transmission risk between waterfowl and livestock, improving protective barriers between wild and domestic species, and abating outbreaks.
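
    The connectivity findings above rest on measuring distances between GPS fixes and sites of interest. As a minimal illustration (not the authors' analysis pipeline), the sketch below uses the standard haversine great-circle formula to compute how far a tagged bird's wetland fix lies from a livestock facility; the coordinates and the 1 km buffer are hypothetical.

    # Illustrative sketch (not the authors' pipeline): great-circle distance between
    # GPS fixes, e.g. to measure how far a tagged duck's earlier wetland roost lies
    # from a livestock facility. Coordinates and the 1 km buffer are hypothetical.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two lat/lon points."""
        r = 6371.0  # mean Earth radius, km
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * r * asin(sqrt(a))

    # Hypothetical example: a wetland GPS fix vs. a facility location
    wetland_fix = (46.25, -119.48)   # lat, lon (made up)
    facility = (46.21, -119.45)      # lat, lon (made up)
    d = haversine_km(*wetland_fix, *facility)
    print(f"distance: {d:.1f} km, within 1 km buffer: {d <= 1.0}")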

    Waterfowl recently infected with low pathogenic avian influenza exhibit reduced local movement and delayed migration

    Get PDF
    Understanding relationships between infection and wildlife movement patterns is important for predicting pathogen spread, especially for multispecies pathogens and those that can spread to humans and domestic animals, such as avian influenza viruses (AIVs). Although infection with low pathogenic AIVs is generally considered asymptomatic in wild birds, prior work has shown that influenza-infected birds occasionally delay migration and/or reduce local movements relative to their uninfected counterparts. However, most observational research to date has focused on a few species in northern Europe; given that influenza viruses are widespread globally and outbreaks of highly pathogenic strains are increasingly common, it is important to explore influenza–movement relationships across more species and regions. Here, we used telemetry data to investigate relationships between influenza infection and movement behavior in 165 individuals from four species of North American waterfowl that overwinter in California, USA. We studied both large-scale migratory and local overwintering movements and found that relationships between influenza infection and movement patterns varied among species. Northern pintails (Anas acuta) with antibodies to avian influenza, indicating prior infection, made migratory stopovers that averaged 12 days longer than those with no influenza antibodies. In contrast, greater white-fronted geese (Anser albifrons) with antibodies to avian influenza made migratory stopovers that averaged 15 days shorter than those with no antibodies. Canvasbacks (Aythya valisineria) that were actively infected with influenza upon capture in the winter delayed spring migration by an average of 28 days relative to birds that were uninfected at the time of capture. At the local scale, northern pintails and canvasbacks that were actively infected with influenza used areas that were 7.6 and 4.9 times smaller than those of uninfected ducks, respectively, during the period of presumed active influenza infection. We found no evidence for an influence of active influenza infection on local movements of mallards (Anas platyrhynchos). These results suggest that avian influenza can influence waterfowl movements and illustrate that the relationships between avian influenza infection and wild bird movements are context- and species-dependent. More generally, understanding and predicting the spread of multihost pathogens requires studying multiple taxa across space and time.

    Energy dependence of charged pion, proton and anti-proton transverse momentum spectra for Au+Au collisions at \sqrt{s_NN} = 62.4 and 200 GeV

    Full text link
    We study the energy dependence of the transverse momentum (pT) spectra for charged pions, protons and anti-protons for Au+Au collisions at \sqrt{s_NN} = 62.4 and 200 GeV. Data are presented at mid-rapidity (|y| < 0.5) for 0.2 < pT < 12 GeV/c. In the intermediate pT region (2 < pT < 6 GeV/c), the nuclear modification factor is higher at 62.4 GeV than at 200 GeV, while at higher pT (pT > 7 GeV/c) the modification is similar for both energies. The p/pi+ and pbar/pi- ratios for central collisions at \sqrt{s_NN} = 62.4 GeV peak at pT ~ 2 GeV/c. In the pT range where recombination is expected to dominate, the p/pi+ ratios at 62.4 GeV are larger than at 200 GeV, while the pbar/pi- ratios are smaller. For pT > 2 GeV/c, the pbar/pi- ratios at the two beam energies are independent of pT and centrality indicating that the dependence of the pbar/pi- ratio on pT does not change between 62.4 and 200 GeV. These findings challenge various models incorporating jet quenching and/or constituent quark coalescence. Comment: 19 pages and 6 figures.
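
    The abstract does not spell out the nuclear modification factor; the standard definition assumed here compares the Au+Au yield with the binary-collision-scaled p+p reference at the same energy:

    % Standard definition assumed here (not stated in the abstract): the Au+Au
    % spectrum divided by the p+p spectrum scaled by the mean number of binary
    % nucleon-nucleon collisions <N_coll> for the given centrality class.
    \[
      R_{AA}(p_T) =
        \frac{\mathrm{d}^2 N_{AA}/\mathrm{d}p_T\,\mathrm{d}y}
             {\langle N_{\mathrm{coll}}\rangle \,\mathrm{d}^2 N_{pp}/\mathrm{d}p_T\,\mathrm{d}y}
    \]

    Values below 1 indicate suppression relative to a superposition of independent nucleon-nucleon collisions, so the higher intermediate-pT values reported at 62.4 GeV correspond to weaker suppression at that energy.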

    Global age-sex-specific fertility, mortality, healthy life expectancy (HALE), and population estimates in 204 countries and territories, 1950–2019: a comprehensive demographic analysis for the Global Burden of Disease Study 2019

    Get PDF
    Background: Accurate and up-to-date assessment of demographic metrics is crucial for understanding a wide range of social, economic, and public health issues that affect populations worldwide. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 produced updated and comprehensive demographic assessments of the key indicators of fertility, mortality, migration, and population for 204 countries and territories and selected subnational locations from 1950 to 2019. Methods: 8078 country-years of vital registration and sample registration data, 938 surveys, 349 censuses, and 238 other sources were identified and used to estimate age-specific fertility. Spatiotemporal Gaussian process regression (ST-GPR) was used to generate age-specific fertility rates for 5-year age groups between ages 15 and 49 years. With extensions to age groups 10–14 and 50–54 years, the total fertility rate (TFR) was then aggregated using the estimated age-specific fertility between ages 10 and 54 years. 7417 sources were used for under-5 mortality estimation and 7355 for adult mortality. ST-GPR was used to synthesise data sources after correction for known biases. Adult mortality was measured as the probability of death between ages 15 and 60 years based on vital registration, sample registration, and sibling histories, and was also estimated using ST-GPR. HIV-free life tables were then estimated using estimates of under-5 and adult mortality rates using a relational model life table system created for GBD, which closely tracks observed age-specific mortality rates from complete vital registration when available. Independent estimates of HIV-specific mortality generated by an epidemiological analysis of HIV prevalence surveys and antenatal clinic serosurveillance and other sources were incorporated into the estimates in countries with large epidemics. Annual and single-year age estimates of net migration and population for each country and territory were generated using a Bayesian hierarchical cohort component model that analysed estimated age-specific fertility and mortality rates along with 1250 censuses and 747 population registry years. We classified location-years into seven categories on the basis of the natural rate of increase in population (calculated by subtracting the crude death rate from the crude birth rate) and the net migration rate. We computed healthy life expectancy (HALE) using years lived with disability (YLDs) per capita, life tables, and standard demographic methods. Uncertainty was propagated throughout the demographic estimation process, including fertility, mortality, and population, with 1000 draw-level estimates produced for each metric. Findings: The global TFR decreased from 2·72 (95% uncertainty interval [UI] 2·66–2·79) in 2000 to 2·31 (2·17–2·46) in 2019. Global annual livebirths increased from 134·5 million (131·5–137·8) in 2000 to a peak of 139·6 million (133·0–146·9) in 2016. Global livebirths then declined to 135·3 million (127·2–144·1) in 2019. Of the 204 countries and territories included in this study, in 2019, 102 had a TFR lower than 2·1, which is considered a good approximation of replacement-level fertility. All countries in sub-Saharan Africa had TFRs above replacement level in 2019 and accounted for 27·1% (95% UI 26·4–27·8) of global livebirths. Global life expectancy at birth increased from 67·2 years (95% UI 66·8–67·6) in 2000 to 73·5 years (72·8–74·3) in 2019. The total number of deaths increased from 50·7 million (49·5–51·9) in 2000 to 56·5 million (53·7–59·2) in 2019. Under-5 deaths declined from 9·6 million (9·1–10·3) in 2000 to 5·0 million (4·3–6·0) in 2019. Global population increased by 25·7%, from 6·2 billion (6·0–6·3) in 2000 to 7·7 billion (7·5–8·0) in 2019. In 2019, 34 countries had negative natural rates of increase; in 17 of these, the population declined because immigration was not sufficient to counteract the negative natural rate of increase. Globally, HALE increased from 58·6 years (56·1–60·8) in 2000 to 63·5 years (60·8–66·1) in 2019. HALE increased in 202 of 204 countries and territories between 2000 and 2019. Interpretation: Over the past 20 years, fertility rates have been dropping steadily and life expectancy has been increasing, with few exceptions. Much of this change follows historical patterns linking social and economic determinants, such as those captured by the GBD Socio-demographic Index, with demographic outcomes. More recently, several countries have experienced a combination of low fertility and stagnating improvement in mortality rates, pushing more populations into the late stages of the demographic transition. Tracking demographic change and the emergence of new patterns will be essential for global health monitoring. Funding: Bill & Melinda Gates Foundation. © 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
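
    As a minimal sketch of the TFR aggregation described in the Methods (the age-specific fertility rates below are hypothetical, not GBD estimates), the total fertility rate is the sum of age-specific rates over the 5-year groups from 10–14 to 50–54, each weighted by the 5-year width of its group:

    # Minimal sketch of TFR aggregation as described in the abstract: sum the
    # age-specific fertility rates (births per woman per year) over the
    # reproductive age groups, weighting each by its 5-year width.
    # ASFR values below are hypothetical, not GBD estimates.
    asfr = {                     # births per woman per year, by 5-year age group
        "10-14": 0.001, "15-19": 0.040, "20-24": 0.110, "25-29": 0.120,
        "30-34": 0.090, "35-39": 0.045, "40-44": 0.012, "45-49": 0.002,
        "50-54": 0.0003,
    }
    tfr = sum(5 * rate for rate in asfr.values())   # each group spans 5 years
    print(f"TFR = {tfr:.2f} children per woman")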

    Global burden of 87 risk factors in 204 countries and territories, 1990–2019: a systematic analysis for the Global Burden of Disease Study 2019

    Get PDF
    Background: Rigorous analysis of levels and trends in exposure to leading risk factors and quantification of their effect on human health are important to identify where public health is making progress and in which cases current efforts are inadequate. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 provides a standardised and comprehensive assessment of the magnitude of risk factor exposure, relative risk, and attributable burden of disease. Methods: GBD 2019 estimated attributable mortality, years of life lost (YLLs), years of life lived with disability (YLDs), and disability-adjusted life-years (DALYs) for 87 risk factors and combinations of risk factors, at the global level, regionally, and for 204 countries and territories. GBD uses a hierarchical list of risk factors so that specific risk factors (eg, sodium intake), and related aggregates (eg, diet quality), are both evaluated. This method has six analytical steps. (1) We included 560 risk–outcome pairs that met criteria for convincing or probable evidence on the basis of research studies. 12 risk–outcome pairs included in GBD 2017 no longer met inclusion criteria and 47 risk–outcome pairs for risks already included in GBD 2017 were added based on new evidence. (2) Relative risks were estimated as a function of exposure based on published systematic reviews, 81 systematic reviews done for GBD 2019, and meta-regression. (3) Levels of exposure in each age-sex-location-year included in the study were estimated based on all available data sources using spatiotemporal Gaussian process regression, DisMod-MR 2.1, a Bayesian meta-regression method, or alternative methods. (4) We determined, from published trials or cohort studies, the level of exposure associated with minimum risk, called the theoretical minimum risk exposure level. (5) Attributable deaths, YLLs, YLDs, and DALYs were computed by multiplying population attributable fractions (PAFs) by the relevant outcome quantity for each age-sex-location-year. (6) PAFs and attributable burden for combinations of risk factors were estimated taking into account mediation of different risk factors through other risk factors. Across all six analytical steps, 30 652 distinct data sources were used in the analysis. Uncertainty in each step of the analysis was propagated into the final estimates of attributable burden. Exposure levels for dichotomous, polytomous, and continuous risk factors were summarised with use of the summary exposure value to facilitate comparisons over time, across location, and across risks. Because the entire time series from 1990 to 2019 has been re-estimated with use of consistent data and methods, these results supersede previously published GBD estimates of attributable burden. Findings: The largest declines in risk exposure from 2010 to 2019 were among a set of risks that are strongly linked to social and economic development, including household air pollution; unsafe water, sanitation, and handwashing; and child growth failure. Global declines also occurred for tobacco smoking and lead exposure. The largest increases in risk exposure were for ambient particulate matter pollution, drug use, high fasting plasma glucose, and high body-mass index. In 2019, the leading Level 2 risk factor globally for attributable deaths was high systolic blood pressure, which accounted for 10·8 million (95% uncertainty interval [UI] 9·51–12·1) deaths (19·2% [16·9–21·3] of all deaths in 2019), followed by tobacco (smoked, second-hand, and chewing), which accounted for 8·71 million (8·12–9·31) deaths (15·4% [14·6–16·2] of all deaths in 2019). The leading Level 2 risk factor for attributable DALYs globally in 2019 was child and maternal malnutrition, which largely affects health in the youngest age groups and accounted for 295 million (253–350) DALYs (11·6% [10·3–13·1] of all global DALYs that year). The risk factor burden varied considerably in 2019 between age groups and locations. Among children aged 0–9 years, the three leading detailed risk factors for attributable DALYs were all related to malnutrition. Iron deficiency was the leading risk factor for those aged 10–24 years, alcohol use for those aged 25–49 years, and high systolic blood pressure for those aged 50–74 years and 75 years and older. Interpretation: Overall, the record for reducing exposure to harmful risks over the past three decades is poor. Success with reducing smoking and lead exposure through regulatory policy might point the way for a stronger role for public policy on other risks in addition to continued efforts to provide information on risk factor harm to the general public. Funding: Bill & Melinda Gates Foundation. © 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
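
    As a minimal, hypothetical sketch of analytical step (5), the snippet below computes a population attributable fraction for a categorical risk factor by comparing the observed exposure distribution with the theoretical-minimum-risk (TMREL) counterfactual, then multiplies the outcome's deaths by the PAF. The exposure shares, relative risks, and death count are illustrative, not GBD values, and the PAF formula is the standard categorical form rather than a quotation from the study.

    # Hypothetical sketch of step (5): PAF for a categorical risk factor, then
    # attributable deaths = PAF x total deaths for that outcome. All numbers
    # below are made up for illustration.
    observed = {"low": 0.50, "medium": 0.30, "high": 0.20}  # observed exposure distribution
    tmrel    = {"low": 1.00, "medium": 0.00, "high": 0.00}  # counterfactual: all at minimum risk
    rr       = {"low": 1.00, "medium": 1.40, "high": 2.10}  # relative risk per exposure level

    obs_risk = sum(observed[k] * rr[k] for k in rr)   # average risk under observed exposure
    cf_risk  = sum(tmrel[k] * rr[k] for k in rr)      # average risk under the TMREL counterfactual
    paf = (obs_risk - cf_risk) / obs_risk

    outcome_deaths = 250_000                          # hypothetical deaths from the outcome
    print(f"PAF = {paf:.3f}, attributable deaths = {paf * outcome_deaths:,.0f}")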

    Fluid dynamics in seagrass ecology - from molecules to ecosystems

    No full text
    Fluid dynamics is the study of the movement of fluids. Among other things, it addresses velocity, acceleration, and the forces exerted by or upon fluids in motion (Daugherty et al., 1985; White, 1999; Kundu and Cohen, 2002). Fluid dynamics affects every aspect of the existence of seagrasses from the smallest to the largest scale: from the nutrients they obtain to the sediment they colonize; from the pollination of their flowers to the import/export of organic matter to adjacent systems; from the light that reaches their leaves to the organisms that live in the seagrass habitats. Therefore, fluid dynamics is of major importance in seagrass biology, ecology, and ecophysiology. Unfortunately, fluid dynamics is often overlooked in seagrass systems (Koch, 2001). This chapter provides a general background in fluid dynamics and then addresses increasingly larger scales of fluid dynamic processes relevant to seagrass ecology and physiology: molecules (μm), leaves and shoots (mm to cm), seagrass canopies (m), seagrass landscapes (100–1,000 m), and seagrasses as part of the biosphere (>1,000 m). Although gases are also fluids, this chapter is restricted to water (i.e. incompressible fluids), how it flows through seagrasses, the forces it exerts on the plants, and the implications that this has for seagrass systems. Seagrasses are not only affected by water in motion, they also affect the currents, waves and turbulence of the water masses surrounding them. This capacity to alter their own environment is referred to as “ecosystem engineering” (Jones et al., 1994, 1997; Thomas et al., 2000). Readers are also encouraged to consult a recent review by Okubo et al. (2002) for a discussion on flow in terrestrial and aquatic vegetation, including freshwater plants, seagrasses, and kelp.
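
    To make the scale hierarchy concrete, a standard dimensionless quantity (not quoted from the chapter) is the Reynolds number, the ratio of inertial to viscous forces; at the same current speed it changes by orders of magnitude between a single blade and a whole canopy, which is why different processes dominate at each scale. The example speeds and length scales below are assumed for illustration.

    % Standard definition: U = flow speed, L = characteristic length scale,
    % \nu = kinematic viscosity of seawater (~10^{-6} m^2 s^{-1}).
    % Example values (blade width 5 mm, canopy height 1 m, U = 0.1 m/s) are assumed.
    \[
      Re = \frac{U L}{\nu}, \qquad
      Re_{\text{blade}} \approx \frac{(0.1\,\mathrm{m\,s^{-1}})(5\times10^{-3}\,\mathrm{m})}{10^{-6}\,\mathrm{m^2\,s^{-1}}} = 5\times10^{2}, \qquad
      Re_{\text{canopy}} \approx \frac{(0.1\,\mathrm{m\,s^{-1}})(1\,\mathrm{m})}{10^{-6}\,\mathrm{m^2\,s^{-1}}} = 10^{5}
    \]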

    Physicians' responses to clinical decision support on an intensive care unit--comparison of four different alerting methods

    No full text
    BACKGROUND: In intensive care environments, technology is omnipresent, ensuring constant monitoring and the administration of critical drugs to unstable patients. A clinical decision support system (CDSS), with its widespread possibilities, can be a valuable tool in supporting adequate patient care. However, it is still unclear how decision support alerts should be presented to physicians and other medical staff to ensure that they are used most effectively. OBJECTIVE: To determine the effect of four different alert presentation methods on alert compliance after the implementation of an advanced CDSS in the intensive care unit (ICU) of our hospital. METHODS: A randomized clinical trial was executed from August 2010 to December 2011, which included all patients admitted to the ICU of our hospital. The CDSS applied contained a set of thirteen locally developed clinical rules. The percentage of alert compliance was compared for four alert presentation methods: pharmacy intervention, physician alert list, electronic health record (EHR) section and pop-up alerts. Additionally, surveys were held to determine the method most preferred by users of the CDSS. RESULTS: In the study period, the CDSS generated 902 unique alerts, primarily related to drug dosing during decreased renal function and potassium disturbances. Alert compliance was highest for recommendations offered in pop-up alerts (41%, n=68/166), followed by pharmacy intervention (33%, n=80/244), the physician alert list (20%, n=40/199) and the EHR section (19%, n=55/293). The method most preferred by clinicians was pharmacy intervention, and pop-up alerts were found suitable as well if applied correctly. The physician alert list and EHR section were not considered suitable for CDSSs in this study. CONCLUSION: The alert presentation method used for a CDSS is crucial for compliance with the alerts generated by its clinical rules and, consequently, for the efficacy of these systems. Active alerts such as pop-ups and pharmacy intervention were more effective than passive alerts, which do not automatically appear within the clinical workflow. In this pilot study, ICU clinicians also preferred pharmacy intervention and pop-up alerts. More research is required to expand these results to other departments and other hospitals, as well as to other types of CDSSs and different alert presentation methods.
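
    The compliance percentages quoted above follow directly from the reported counts; the small sketch below (using only figures given in the abstract) recomputes them per presentation method.

    # Recomputing the alert-compliance percentages reported in the abstract from
    # the quoted counts (complied alerts / generated alerts per method).
    methods = {
        "pop-up alerts":         (68, 166),
        "pharmacy intervention": (80, 244),
        "physician alert list":  (40, 199),
        "EHR section":           (55, 293),
    }
    for name, (complied, generated) in methods.items():
        print(f"{name:>22}: {100 * complied / generated:.0f}% ({complied}/{generated})")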